H∞ bounds for least-squares estimators

Authors

  • Babak Hassibi
  • Thomas Kailath
Abstract

In this paper we obtain upper and lower bounds for the H∞ norm of the Kalman filter and the RLS algorithm, with respect to prediction and filtered errors. These bounds can be used to study the robustness properties of such estimators. One main conclusion is that, unlike H∞-optimal estimators, which do not allow for any amplification of the disturbances, the least-squares estimators do allow for such amplification. This fact can be especially pronounced in the prediction error case, whereas in the filtered error case the energy amplification is at most four. Moreover, it is shown that the H∞ norm for RLS is data-dependent, whereas for LMS and normalized LMS the H∞ norm is simply unity.
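
To make the energy-gain reading of the H∞ norm concrete, here is a minimal numpy sketch (not from the paper) that checks the LMS claim empirically: with step size μ ≤ 1/max_i ||h_i||², the ratio of undisturbed a-priori prediction-error energy to disturbance energy (initial weight error weighted by μ⁻¹, plus noise energy) should not exceed one. The model, dimensions, noise level, and the 0.9 step-size margin are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n, N = 4, 2000                       # filter order, number of samples
w_true = rng.standard_normal(n)      # unknown weight vector w
H = rng.standard_normal((N, n))      # regressors h_i
v = 0.1 * rng.standard_normal(N)     # additive disturbances v_i
d = H @ w_true + v                   # observations d_i = h_i^T w + v_i

mu = 0.9 / np.max(np.sum(H**2, axis=1))  # step size mu <= 1 / max_i ||h_i||^2

w = np.zeros(n)                      # initial estimate w_{-1}
pred_err_energy = 0.0
for h, di in zip(H, d):
    e = h @ w_true - h @ w           # undisturbed a-priori error h_i^T (w - w_{i-1})
    pred_err_energy += e**2
    w += mu * (di - h @ w) * h       # LMS update

# disturbance energy: mu^{-1} * initial weight-error energy + noise energy
dist_energy = w_true @ w_true / mu + v @ v
print("empirical energy gain:", pred_err_energy / dist_energy)  # expect <= 1
```

Any single run only gives a lower bound on the worst-case gain, but the theorem guarantees the printed ratio stays below one for every disturbance sequence once the step-size condition holds.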

Similar articles

Variance bounds for estimators in autoregressive models with constraints

We consider nonlinear and heteroscedastic autoregressive models whose residuals are martingale increments with conditional distributions that fulfill certain constraints. We treat two classes of constraints: residuals depending on the past through some function of the past observations only, and residuals that are invariant under some finite group of transformations. We determine the efficient ...
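
For concreteness, a hypothetical member of this model class can be simulated in a few lines; the mean function m and scale function s below are arbitrary illustrative choices, and the standard normal residuals happen to satisfy the simplest group-invariance constraint (symmetry under sign flip).

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical instance: X_t = m(X_{t-1}) + s(X_{t-1}) * eps_t
def m(x):
    return 0.6 * np.tanh(x)      # nonlinear conditional mean (illustrative)

def s(x):
    return 0.5 + 0.3 * abs(x)    # heteroscedastic conditional scale (illustrative)

N = 1000
x = np.zeros(N)
for t in range(1, N):
    # residuals are martingale increments; symmetric here (invariant under {+1, -1})
    x[t] = m(x[t - 1]) + s(x[t - 1]) * rng.standard_normal()
print("sample mean / variance:", x.mean(), x.var())
```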

Bounds on the prediction error of penalized least squares estimators with convex penalty

This paper considers the penalized least squares estimator with arbitrary convex penalty. When the observation noise is Gaussian, we show that the prediction error is a subgaussian random variable concentrated around its median. We apply this concentration property to derive sharp oracle inequalities for the prediction error of the LASSO, the group LASSO and the SLOPE estimators, both in probab...
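
As an illustrative check of this concentration phenomenon (a sketch, not the paper's construction), one can repeatedly refit the LASSO on fresh Gaussian noise and observe that the prediction error ||X(β̂ − β)||² clusters tightly around its median; the dimensions, sparsity level, and regularization weight below are assumptions.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(2)

n, p = 100, 50
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = 1.0                          # sparse ground truth

# Refit on fresh Gaussian noise draws; the prediction error should concentrate.
errors = []
for _ in range(200):
    y = X @ beta + rng.standard_normal(n)
    beta_hat = Lasso(alpha=0.1).fit(X, y).coef_
    errors.append(np.sum((X @ (beta_hat - beta)) ** 2))

q25, q50, q75 = np.percentile(errors, [25, 50, 75])
print(f"median {q50:.2f}, interquartile range {q75 - q25:.2f}")
```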

Mixing least-squares estimators when the variance is unknown

We propose a procedure to handle the problem of Gaussian regression when the variance is unknown. We mix least-squares estimators from various models according to a procedure inspired by that of Leung and Barron [IEEE Trans. Inform. Theory 52 (2006) 3396–3410]. We show that in some cases, the resulting estimator is a simple shrinkage estimator. We then apply this procedure to perform adaptive e...
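
A minimal sketch of exponential-weights mixing in this spirit (with a known variance, unlike the paper's setting, and a Cp-style score and temperature chosen for illustration): each candidate least-squares fit is weighted by exp(−score/4σ²) and the fits are averaged.

```python
import numpy as np

rng = np.random.default_rng(3)

n, p = 60, 10
X = rng.standard_normal((n, p))
beta = np.concatenate([[2.0, -1.5, 1.0], np.zeros(p - 3)])
sigma = 1.0                              # treated as known in this sketch only
y = X @ beta + sigma * rng.standard_normal(n)

# Candidate models: least-squares fits on the first k columns.
fits, scores = [], []
for k in range(1, p + 1):
    coef, *_ = np.linalg.lstsq(X[:, :k], y, rcond=None)
    yhat = X[:, :k] @ coef
    fits.append(yhat)
    scores.append(np.sum((y - yhat) ** 2) + 2 * sigma**2 * k)  # Cp-style score

# Exponential weights with temperature 4*sigma^2 (illustrative choice).
scores = np.array(scores)
w = np.exp(-(scores - scores.min()) / (4 * sigma**2))
w /= w.sum()
y_mix = w @ np.array(fits)               # aggregated fit: weighted average
print("model weights:", np.round(w, 3))
```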

Mixing Least-Squares Estimators

We propose a procedure to handle the problem of Gaussian regression when the variance is unknown. We mix least-squares estimators from various models according to a procedure inspired by that of Leung and Barron [17]. We show that in some cases the resulting estimator is a simple shrinkage estimator. We then apply this procedure in various statistical settings such as linear regression or adapt...

On-line Parameter Interval Estimation Using Recursive Least Squares

A bank of recursive least-squares (RLS) estimators is proposed for the estimation of the uncertainty intervals of the parameters of an equation error model (or RLS model) where the equation error is assumed to lie between known upper and lower bounds. It is shown that the off-line least-squares method gives the maximum and minimum parameter values that could have produced the recorded input-out...
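
For reference, here is a minimal sketch of the standard RLS recursion that each estimator in such a bank would run (the bank construction and interval logic of the paper are not reproduced); the bounded uniform equation error, model sizes, and forgetting factor are illustrative assumptions.

```python
import numpy as np

def rls_step(theta, P, phi, y, lam=1.0):
    """One recursive least-squares update with forgetting factor lam."""
    k = P @ phi / (lam + phi @ P @ phi)      # gain vector
    theta = theta + k * (y - phi @ theta)    # correct by the equation error
    P = (P - np.outer(k, phi @ P)) / lam     # update inverse information matrix
    return theta, P

rng = np.random.default_rng(4)
theta_true = np.array([1.0, -0.5])
theta, P = np.zeros(2), 1e3 * np.eye(2)      # diffuse initialization
for _ in range(500):
    phi = rng.standard_normal(2)
    e = rng.uniform(-0.1, 0.1)               # equation error within known bounds
    theta, P = rls_step(theta, P, phi, phi @ theta_true + e)
print("estimate:", theta, "truth:", theta_true)
```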

Journal:
  • IEEE Trans. Automat. Contr.

Volume: 46, Issue: -

Pages: -

Publication date: 2001